Results 1 - 20 of 49
1.
J R Stat Soc Series B Stat Methodol ; 86(1): 177-193, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38344135

ABSTRACT

The analysis of excursion sets in imaging data is essential to a wide range of scientific disciplines such as neuroimaging, climatology, and cosmology. Despite growing literature, there is little published concerning the comparison of processes that have been sampled across the same spatial region but which reflect different study conditions. Given a set of asymptotically Gaussian random fields, each corresponding to a sample acquired for a different study condition, this work aims to provide confidence statements about the intersection, or union, of the excursion sets across all fields. Such spatial regions are of natural interest as they directly correspond to the questions 'Where do all random fields exceed a predetermined threshold?', or 'Where does at least one random field exceed a predetermined threshold?'. To assess the degree of spatial variability present, our method provides, with a desired confidence, subsets and supersets of spatial regions defined by logical conjunctions (i.e. set intersections) or disjunctions (i.e. set unions), without any assumption on the dependence between the different fields. The method is verified by extensive simulations and demonstrated using task-fMRI data to identify brain regions with activation common to four variants of a working memory task.
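As a rough illustration of the logical conjunctions and disjunctions described above (not of the paper's confidence-statement procedure), the two target regions can be computed from thresholded fields with a few array operations; the number of fields, grid size, and threshold below are arbitrary:

```python
import numpy as np

# Toy sketch: given several fields sampled on the same grid, find where
# all of them, or at least one of them, exceed a threshold c.
rng = np.random.default_rng(0)
fields = rng.standard_normal((4, 64, 64))  # 4 study conditions, one field each
c = 1.0                                    # excursion threshold

exceed = fields > c                   # per-field excursion sets
conjunction = exceed.all(axis=0)      # where ALL fields exceed c
disjunction = exceed.any(axis=0)      # where AT LEAST ONE field exceeds c

# the intersection is always contained in the union
assert np.all(conjunction <= disjunction)
```

The paper's contribution is the confidence statement attached to such regions — sub- and supersets that cover the true conjunction or disjunction with a desired probability — which the sketch above does not attempt.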

2.
Ophthalmol Glaucoma ; 6(2): 147-159, 2023.
Article in English | MEDLINE | ID: mdl-36038107

ABSTRACT

PURPOSE: To investigate the efficacy of a deep learning regression method to predict macula ganglion cell-inner plexiform layer (GCIPL) and optic nerve head (ONH) retinal nerve fiber layer (RNFL) thickness for use in glaucoma neuroprotection clinical trials. DESIGN: Cross-sectional study. PARTICIPANTS: Glaucoma patients with good quality macula and ONH scans enrolled in 2 longitudinal studies, the African Descent and Glaucoma Evaluation Study and the Diagnostic Innovations in Glaucoma Study. METHODS: Spectralis macula posterior pole scans and ONH circle scans on 3327 pairs of GCIPL/RNFL scans from 1096 eyes (550 patients) were included. Participants were randomly distributed into a training and validation dataset (90%) and a test dataset (10%) by participant. Networks had access to GCIPL and RNFL data from one hemiretina of the probe eye and all data of the fellow eye. The models were then trained to predict the GCIPL or RNFL thickness of the remaining probe eye hemiretina. MAIN OUTCOME MEASURES: Mean absolute error (MAE) and squared Pearson correlation coefficient (r2) were used to evaluate model performance. RESULTS: The deep learning model was able to predict superior and inferior GCIPL thicknesses with a global r2 value of 0.90 and 0.86, r2 of mean of 0.90 and 0.86, and mean MAE of 3.72 µm and 4.2 µm, respectively. For superior and inferior RNFL thickness predictions, model performance was slightly lower, with a global r2 of 0.75 and 0.84, r2 of mean of 0.81 and 0.82, and MAE of 9.31 µm and 8.57 µm, respectively. There was only a modest decrease in model performance when predicting GCIPL and RNFL in more severe disease. Using individualized hemiretinal predictions to account for variability across patients, we estimate that a clinical trial can detect a difference equivalent to a 25% treatment effect over 24 months with an 11-fold reduction in the number of patients compared to a conventional trial. 
CONCLUSIONS: Our deep learning models were able to accurately estimate both macula GCIPL and ONH RNFL hemiretinal thickness. Using an internal control based on these model predictions may help reduce clinical trial sample size requirements and facilitate investigation of new glaucoma neuroprotection therapies. FINANCIAL DISCLOSURE(S): Proprietary or commercial disclosure may be found after the references.


Subject(s)
Deep Learning, Glaucoma, Humans, Cross-Sectional Studies, Neuroprotection, Intraocular Pressure, Nerve Fibers, Visual Fields, Retinal Ganglion Cells, Optical Coherence Tomography/methods, Clinical Trials as Topic, Glaucoma/diagnosis
3.
bioRxiv ; 2023 Dec 13.
Article in English | MEDLINE | ID: mdl-38168311

ABSTRACT

Many recent studies have demonstrated the inflated type 1 error rate of the original Gaussian random field (GRF) methods for inference on neuroimages and identified resampling (permutation and bootstrapping) methods that have better performance. There has been no evaluation of resampling procedures when using robust (sandwich) statistical images with the different topological features (TFs) used for neuroimaging inference. Here, we consider estimation of the distributions of TFs of a statistical image and evaluate resampling procedures that can be used when exchangeability is violated. We compare the methods using realistic simulations and study sex differences in life-span age-related changes in gray matter volume in the Nathan Kline Institute Rockland sample. We find that our proposed wild bootstrap and the commonly used permutation procedure perform well in sample sizes above 50 under realistic simulations with heteroskedasticity. The Rademacher wild bootstrap has fewer assumptions than the permutation procedure and performs similarly in samples of 100 or more, so it is valid in a broader range of conditions. We also evaluate the GRF-based pTFCE method and show that it has inflated error rates in samples of less than 200. Our R package, pbj, is available on GitHub and allows the user to reproducibly implement various resampling-based group-level neuroimage analyses.
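The Rademacher wild bootstrap mentioned above can be sketched in a few lines for the simplest case of a one-sample t-statistic; this toy version (sample size, heteroskedastic noise, and number of bootstrap draws chosen arbitrarily) conveys the idea but is not the group-level neuroimaging procedure implemented in pbj:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
# heteroskedastic residuals, a setting where plain permutation assumptions fail
resid = rng.standard_normal(n) * (1 + np.linspace(0, 1, n))
t_obs = resid.mean() / (resid.std(ddof=1) / np.sqrt(n))

B = 999
t_boot = np.empty(B)
for b in range(B):
    w = rng.choice([-1.0, 1.0], size=n)   # Rademacher multipliers
    r = w * resid                         # sign-flipped residuals
    t_boot[b] = r.mean() / (r.std(ddof=1) / np.sqrt(n))

# two-sided bootstrap p-value for the mean-zero null
p = (1 + np.sum(np.abs(t_boot) >= abs(t_obs))) / (B + 1)
```

Because the multipliers only assume symmetric errors rather than full exchangeability, the same recipe remains valid under the heteroskedasticity studied in the paper.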

4.
Glob Chang Biol ; 28(24): 7327-7339, 2022 12.
Article in English | MEDLINE | ID: mdl-36117409

ABSTRACT

We explore the ability of the atmospheric CO2 record since 1900 to constrain the source of CO2 from land use and land cover change (hereafter "land use"), taking account of uncertainties in other terms in the global carbon budget. We find that the atmospheric constraint favors land use CO2 flux estimates with lower decadal variability and can identify potentially erroneous features, such as emission peaks around 1960 and after 2000, in some published estimates. Furthermore, we resolve an offset in the global carbon budget that is most plausibly attributed to the land use flux. This correction shifts the mean land use flux since 1900 across 20 published estimates down by 0.35 PgC year-1 to 1.04 ± 0.57 PgC year-1, which is within the range but at the low end of these estimates. We show that the atmospheric CO2 record can provide insights into the time history of the land use flux that may reduce uncertainty in this term and improve current understanding and projections of the global carbon cycle.


Subject(s)
Carbon Dioxide, Ecosystem, Carbon Cycle, Carbon, Uncertainty
5.
J Geophys Res Atmos ; 127(13): e2021JD035892, 2022 Jul 16.
Article in English | MEDLINE | ID: mdl-35864859

ABSTRACT

Long-term measurements at the Mauna Loa Observatory (MLO) show that the CO2 seasonal cycle amplitude (SCA) increased from 1959 to 2019 at an overall rate of 0.22 ± 0.034 ppm decade-1 while also varying on interannual to decadal time scales. These SCA changes are a signature of changes in land ecological CO2 fluxes as well as shifting winds. Simulations with the TM3 tracer transport model and CO2 fluxes from the Jena CarboScope CO2 Inversion suggest that shifting winds alone have contributed to a decrease in SCA of -0.10 ± 0.022 ppm decade-1 from 1959 to 2019, partly offsetting the observed long-term SCA increase associated with enhanced ecosystem net primary production. According to these simulations and MIROC-ACTM simulations, the shorter-term variability of MLO SCA is nearly equally driven by varying ecological CO2 fluxes (49%) and varying winds (51%). We also show that the MLO SCA is strongly correlated with the Pacific Decadal Oscillation (PDO) due to varying winds, as well as with a closely related wind index (U-PDO). Since 1980, 44% of the wind-driven SCA decrease has been tied to a secular trend in the U-PDO, which is associated with a progressive weakening of westerly winds at 700 mbar over the central Pacific from 20°N to 40°N. Similar impacts of varying winds on the SCA are seen in simulations at other low-latitude Pacific stations, illustrating the difficulty of constraining trend and variability of land CO2 fluxes using observations from low latitudes due to the complexity of circulation changes.

6.
J Stat Plan Inference ; 216: 70-94, 2022 Jan.
Article in English | MEDLINE | ID: mdl-35813237

ABSTRACT

We propose a construction of simultaneous confidence bands (SCBs) for functional parameters over arbitrary-dimensional compact domains using the Gaussian kinematic formula of t-processes (tGKF). Although the tGKF relies on Gaussianity, we show that a central limit theorem (CLT) for the parameter of interest is enough to obtain asymptotically precise covering even if the observations are non-Gaussian processes. As a proof of concept we study the functional signal-plus-noise model and derive a CLT for an estimator of the Lipschitz-Killing curvatures, the only data-dependent quantities in the tGKF. We further discuss extensions to discrete sampling with additive observation noise using scale space ideas from regression analysis. Our theoretical work is accompanied by a simulation study comparing different methods to construct SCBs for the population mean. We show that the tGKF outperforms state-of-the-art methods with precise covering for small sample sizes, and only a Rademacher multiplier-t bootstrap performs similarly well. A further benefit is that our SCBs are computationally fast even for domains of dimension greater than one. Applications of SCBs to diffusion tensor imaging (DTI) fibers (1D) and spatio-temporal temperature data (2D) are discussed.
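For intuition, the Rademacher multiplier-t bootstrap that the abstract compares against can be sketched for a one-dimensional functional mean; all constants below are illustrative, and this is not the tGKF construction itself:

```python
import numpy as np

# Simultaneous confidence band for a functional mean via the max statistic
# of a Rademacher multiplier bootstrap (toy setting: n curves on T grid points).
rng = np.random.default_rng(3)
n, T = 40, 50
grid = np.linspace(0, 1, T)
data = np.sin(2 * np.pi * grid) + rng.standard_normal((n, T))

mean = data.mean(axis=0)
sd = data.std(axis=0, ddof=1)
resid = data - mean

B = 500
maxstats = np.empty(B)
for b in range(B):
    w = rng.choice([-1.0, 1.0], size=(n, 1))        # one multiplier per curve
    boot_mean = (w * resid).mean(axis=0)
    boot_sd = (w * resid).std(axis=0, ddof=1)
    maxstats[b] = np.max(np.abs(boot_mean) / (boot_sd / np.sqrt(n)))

q = np.quantile(maxstats, 0.95)     # simultaneous (not pointwise) quantile
lower = mean - q * sd / np.sqrt(n)
upper = mean + q * sd / np.sqrt(n)
```

The simultaneous quantile q necessarily exceeds the pointwise normal quantile of about 1.96, which is what makes the band cover the whole function at once; the tGKF obtains this quantile analytically instead of by resampling.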

7.
J Multivar Anal ; 192, 2022 Nov.
Article in English | MEDLINE | ID: mdl-38094514

ABSTRACT

Given a functional central limit (fCLT) for an estimator and a parameter transformation, we construct random processes, called functional delta residuals, which asymptotically have the same covariance structure as the limit process of the functional delta method. An explicit construction of these residuals for transformations of moment-based estimators and a multiplier bootstrap fCLT for the resulting functional delta residuals are proven. The latter is used to consistently estimate the quantiles of the maximum of the limit process of the functional delta method in order to construct asymptotically valid simultaneous confidence bands for the transformed functional parameters. Performance of the coverage rate of the developed construction, applied to functional versions of Cohen's d, skewness and kurtosis, is illustrated in simulations and their application to test Gaussianity is discussed.

8.
Neuroimage ; 226: 117477, 2021 02 01.
Article in English | MEDLINE | ID: mdl-33166643

ABSTRACT

Current statistical inference methods for task-fMRI suffer from two fundamental limitations. First, the focus is solely on detection of non-zero signal or signal change, a problem that is exacerbated for large scale studies (e.g. UK Biobank, N=40,000+) where the 'null hypothesis fallacy' causes even trivial effects to be determined as significant. Second, for any sample size, widely used cluster inference methods only indicate regions where a null hypothesis can be rejected, without providing any notion of spatial uncertainty about the activation. In this work, we address these issues by developing spatial Confidence Sets (CSs) on clusters found in thresholded Cohen's d effect size images. We produce an upper and lower CS to make confidence statements about brain regions where Cohen's d effect sizes have exceeded and fallen short of a non-zero threshold, respectively. The CSs convey information about the magnitude and reliability of effect sizes that is usually given separately in a t-statistic and effect estimate map. We expand the theory developed in our previous work on CSs for %BOLD change effect maps (Bowring et al., 2019) using recent results from the bootstrapping literature. By assessing the empirical coverage with 2D and 3D Monte Carlo simulations resembling fMRI data, we find our method is accurate in sample sizes as low as N=60. We compute Cohen's d CSs for the Human Connectome Project working memory task-fMRI data, illustrating the brain regions with a reliable Cohen's d response for a given threshold. By comparing the CSs with results obtained from a traditional statistical voxelwise inference, we highlight the improvement in activation localization that can be gained with the Confidence Sets.


Subject(s)
Brain/diagnostic imaging, Connectome/methods, Magnetic Resonance Imaging/methods, Humans, Sample Size
9.
Sci Rep ; 10(1): 20336, 2020 11 23.
Article in English | MEDLINE | ID: mdl-33230152

ABSTRACT

We propose a random forest classifier for identifying adequacy of liver MR images using handcrafted (HC) features and deep convolutional neural networks (CNNs), and analyze the relative role of these two components in relation to the training sample size. The HC features, specifically developed for this application, include Gaussian mixture models, Euler characteristic curves and texture analysis. Using HC features outperforms the CNN for smaller sample sizes and with increased interpretability. On the other hand, with enough training data, the combined classifier outperforms the models trained with HC features or CNN features alone. These results illustrate the added value of HC features with respect to CNNs, especially when insufficient data is available, as is often found in clinical studies.
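Among the handcrafted features named above, the Euler characteristic curve is simple to illustrate in one dimension, where the Euler characteristic of an excursion set is just its number of connected components; this toy sketch (signal and thresholds chosen for illustration) is not the paper's 2D liver-image feature:

```python
import numpy as np

def ec_curve_1d(signal, thresholds):
    """Euler characteristic curve of a 1-D signal: for each threshold t,
    count the connected components of the excursion set {i : signal[i] > t}."""
    ecs = []
    for t in thresholds:
        above = signal > t
        # a component starts wherever `above` switches False -> True
        starts = np.diff(above.astype(int), prepend=0) == 1
        ecs.append(int(starts.sum()))
    return np.array(ecs)

x = np.linspace(0, 4 * np.pi, 500)
sig = np.sin(x)
ts = np.array([-0.5, 0.0, 0.5])
print(ec_curve_1d(sig, ts))  # prints [3 2 2]
```

Curves like this, evaluated over a grid of thresholds, give a compact interpretable summary of image texture that can be fed to a classifier alongside CNN features.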

10.
Eur J Radiol ; 124: 108837, 2020 Mar.
Article in English | MEDLINE | ID: mdl-31958630

ABSTRACT

PURPOSE: To develop and evaluate the performance of a fully automated convolutional neural network (CNN)-based algorithm to evaluate hepatobiliary phase (HBP) adequacy of gadoxetate disodium (EOB)-enhanced MRI. Secondarily, we explored the potential of the proposed CNN algorithm to reduce examination length by applying it to EOB-MRI examinations. METHODS: We retrospectively identified EOB-enhanced MRI-HBP series from examinations performed from 2011 to 2018 (internal and external datasets). Our algorithm, comprising a liver segmentation and classification CNN, produces an adequacy score. Two abdominal radiologists independently classified series as adequate or suboptimal. The consensus determination of HBP adequacy was used as ground truth for CNN model training and validation. Reader agreement was evaluated with Cohen's kappa. Performance of the algorithm was assessed by receiver operating characteristic (ROC) analysis and computation of the area under the ROC curve (AUC). Potential examination duration reduction was evaluated descriptively. RESULTS: 1408 HBP series from 484 patients were included. Reader kappa agreement was 0.67 (internal dataset) and 0.80 (external dataset). AUCs were 0.97 (0.96-0.99) for the internal dataset and 0.95 (0.92-0.96) for the external dataset and were not significantly different from each other (p = 0.24). 48% (50/105) of examinations could have been shortened by applying the algorithm. CONCLUSION: The proposed CNN-based algorithm achieves higher than 95% AUC for classifying HBP images as adequate versus suboptimal. The application of this algorithm could potentially shorten examination time and aid radiologists in recognizing technically suboptimal images, avoiding diagnostic pitfalls.
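As a reminder of what the reported AUC measures: the area under the ROC curve equals the probability that a randomly chosen adequate series receives a higher score than a randomly chosen suboptimal one. A minimal rank-based (Mann-Whitney) computation, ignoring ties, might look like this (toy scores, not the study's data):

```python
import numpy as np

def auc_rank(scores, labels):
    """Rank-based (Mann-Whitney) AUC: the probability that a randomly chosen
    positive scores higher than a randomly chosen negative. Ties not handled."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, bool)
    order = scores.argsort()
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

print(auc_rank([0.1, 0.4, 0.35, 0.8], [0, 0, 1, 1]))  # prints 0.75
```

Here one of the two positives outranks both negatives and the other outranks one of two, giving 3 of 4 concordant pairs, i.e. 0.75.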


Subject(s)
Contrast Media/pharmacokinetics, Gadolinium DTPA/pharmacokinetics, Computer-Assisted Image Interpretation/methods, Liver Neoplasms/diagnostic imaging, Magnetic Resonance Imaging/methods, Neural Networks (Computer), Adult, Aged, Algorithms, Area Under Curve, Efficiency, Female, Humans, Liver/diagnostic imaging, Male, Middle Aged, ROC Curve, Retrospective Studies, Time, Workflow
11.
Eur Radiol Exp ; 3(1): 43, 2019 10 26.
Article in English | MEDLINE | ID: mdl-31655943

ABSTRACT

BACKGROUND: Liver alignment between series/exams is challenged by dynamic morphology or variability in patient positioning or motion. Image registration can improve image interpretation and lesion co-localization. We assessed the performance of a convolutional neural network algorithm to register cross-sectional liver imaging series and compared its performance to manual image registration. METHODS: Three hundred fourteen patients, including internal and external datasets, who underwent gadoxetate disodium-enhanced magnetic resonance imaging for clinical care from 2011 to 2018, were retrospectively selected. Automated registration was applied to all 2,663 within-patient series pairs derived from these datasets. Additionally, 100 within-patient series pairs from the internal dataset were independently manually registered by expert readers. Liver overlap, image correlation, and intra-observation distances for manual versus automated registrations were compared using paired t tests. Influence of patient demographics, imaging characteristics, and liver uptake function was evaluated using univariate and multivariate mixed models. RESULTS: Compared to the manual, automated registration produced significantly lower intra-observation distance (p < 0.001) and higher liver overlap and image correlation (p < 0.001). Intra-exam automated registration achieved 0.88 mean liver overlap and 0.44 mean image correlation for the internal dataset and 0.91 and 0.41, respectively, for the external dataset. For inter-exam registration, mean overlap was 0.81 and image correlation 0.41. Older age, female sex, greater inter-series time interval, differing uptake, and greater voxel size differences independently reduced automated registration performance (p ≤ 0.020). CONCLUSION: A fully automated algorithm accurately registered the liver within and between examinations, yielding better liver and focal observation co-localization compared to manual registration.


Subject(s)
Algorithms, Liver/diagnostic imaging, Magnetic Resonance Imaging/methods, Neural Networks (Computer), Adult, Aged, Female, Humans, Male, Middle Aged, Retrospective Studies
12.
Neuroimage ; 203: 116187, 2019 12.
Article in English | MEDLINE | ID: mdl-31533067

ABSTRACT

The mass-univariate approach for functional magnetic resonance imaging (fMRI) analysis remains a widely used statistical tool within neuroimaging. However, this method suffers from at least two fundamental limitations: First, with sufficient sample sizes there is high enough statistical power to reject the null hypothesis everywhere, making it difficult if not impossible to localize effects of interest. Second, with any sample size, when cluster-size inference is used a significant p-value only indicates that a cluster is larger than chance. Therefore, no notion of confidence is available to express the size or location of a cluster that could be expected with repeated sampling from the population. In this work, we address these issues by extending a method proposed by Sommerfeld et al. (2018) (SSS) to develop spatial Confidence Sets (CSs) on clusters found in thresholded raw effect size maps. While hypothesis testing indicates where the null, i.e. a raw effect size of zero, can be rejected, the CSs give statements on the locations where raw effect sizes exceed, and fall short of, a non-zero threshold, providing both an upper and lower CS. While the method can be applied to any mass-univariate general linear model, we motivate the method in the context of blood-oxygen-level-dependent (BOLD) fMRI contrast maps for inference on percentage BOLD change raw effects. We propose several theoretical and practical implementation advancements to the original method formulated in SSS, delivering a procedure with superior performance in sample sizes as low as N=60. We validate the method with 3D Monte Carlo simulations that resemble fMRI data. Finally, we compute CSs for the Human Connectome Project working memory task contrast images, illustrating the brain regions that show a reliable %BOLD change for a given %BOLD threshold.


Subject(s)
Brain Mapping/methods, Brain/physiology, Magnetic Resonance Imaging, Statistical Data Interpretation, Humans, Computer-Assisted Image Processing/methods, Sample Size
13.
Neuroimage ; 197: 402-413, 2019 08 15.
Article in English | MEDLINE | ID: mdl-31028923

ABSTRACT

Peaks are a mainstay of neuroimage analysis for reporting localization results. The current peak detection procedure in SPM12 requires a pre-threshold for approximating p-values and a false discovery rate (FDR) nominal level for inference. However, the pre-threshold is an undesirable feature, while the FDR level is meaningless if the null hypothesis is not properly defined. This article provides: 1) a peak height distribution for smooth Gaussian error fields, which does not require a screening pre-threshold; 2) a signal-plus-noise model where FDR of peaks can be controlled and properly interpreted. Matlab code for calculation of p-values using the exact peak height distribution is available as an SPM extension.
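For context, the peaks in question are local maxima of a smooth random field; a toy 1-D version (arbitrary smoothing kernel and seed) that collects peak heights, the quantity whose exact null distribution the article derives, could look like:

```python
import numpy as np

# Smooth a white-noise sequence with a Gaussian kernel to obtain a smooth
# error field, then record the heights of its local maxima. The article's
# contribution is the exact distribution of such heights, not reproduced here.
rng = np.random.default_rng(2)
noise = rng.standard_normal(1000)
kernel = np.exp(-0.5 * (np.arange(-15, 16) / 5.0) ** 2)
field = np.convolve(noise, kernel / kernel.sum(), mode="same")

interior = field[1:-1]
is_peak = (interior > field[:-2]) & (interior > field[2:])   # strict local maxima
peak_heights = interior[is_peak]    # inputs to a peak height distribution
```

Under the SPM12 procedure these heights would first be screened by a pre-threshold before p-values are approximated; the point of the paper is that the exact height distribution removes the need for that screening step.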


Subject(s)
Brain Mapping/methods, Brain/physiology, Neuroimaging/methods, Statistical Data Interpretation, Humans, Computer-Assisted Image Processing/methods, Normal Distribution, Reproducibility of Results
14.
Ann Appl Stat ; 13(4): 2509-2538, 2019 Dec.
Article in English | MEDLINE | ID: mdl-38222269

ABSTRACT

Analysis of genome-wide association studies (GWAS) is characterized by a large number of univariate regressions where a quantitative trait is regressed on hundreds of thousands to millions of single-nucleotide polymorphism (SNP) allele counts, one at a time. This article proposes an estimator of the SNP heritability of the trait, defined here as the fraction of the variance of the trait explained by the SNPs in the study. The proposed GWAS heritability (GWASH) estimator is easy to compute, highly interpretable, and is consistent as the number of SNPs and the sample size increase. More importantly, it can be computed from summary statistics typically reported in GWAS, not requiring access to the original data. The estimator takes full account of the linkage disequilibrium (LD) or correlation between the SNPs in the study through moments of the LD matrix, estimable from auxiliary datasets. Unlike other proposed estimators in the literature, we establish the theoretical properties of the GWASH estimator and obtain analytical estimates of the precision, allowing for power and sample size calculations for SNP heritability estimates, and forming a firm foundation for future methodological development.

15.
J Am Stat Assoc ; 113(523): 1327-1340, 2018.
Article in English | MEDLINE | ID: mdl-31452557

ABSTRACT

The goal of this paper is to give confidence regions for the excursion set of a spatial function above a given threshold from repeated noisy observations on a fine grid of fixed locations. Given an asymptotically Gaussian estimator of the target function, a pair of data-dependent nested excursion sets are constructed that are sub- and super-sets of the true excursion set, respectively, with a desired confidence. Asymptotic coverage probabilities are determined via a multiplier bootstrap method, not requiring Gaussianity of the original data nor stationarity or smoothness of the limiting Gaussian field. The method is used to determine regions in North America where the mean summer and winter temperatures are expected to increase by mid 21st century by more than 2 degrees Celsius.

16.
Bernoulli (Andover) ; 24(4B): 3422-3446, 2018 Nov.
Article in English | MEDLINE | ID: mdl-31511762

ABSTRACT

We obtain formulae for the expected number and height distribution of critical points of smooth isotropic Gaussian random fields parameterized on Euclidean space or spheres of arbitrary dimension. The results hold in general in the sense that there are no restrictions on the covariance function of the field except for smoothness and isotropy. The results are based on a characterization of the distribution of the Hessian of the Gaussian field by means of the family of Gaussian orthogonally invariant (GOI) matrices, of which the Gaussian orthogonal ensemble (GOE) is a special case. The obtained formulae depend on the covariance function only through a single parameter (Euclidean space) or two parameters (spheres), and include the special boundary case of random Laplacian eigenfunctions.

17.
Article in English | MEDLINE | ID: mdl-28936474

ABSTRACT

We assess similarities and differences between model effects for the North American Regional Climate Change Assessment Program (NARCCAP) climate models using varying classes of linear regression models. Specifically, we consider how the average temperature effect differs for the various global and regional climate model combinations, including assessment of possible interaction between the effects of global and regional climate models. We use both pointwise and simultaneous inference procedures to identify regions where global and regional climate model effects differ. We also show conclusively that results from pointwise inference are misleading, and that accounting for multiple comparisons is important for making proper inference.

18.
J Med Imaging (Bellingham) ; 4(2): 024006, 2017 Apr.
Article in English | MEDLINE | ID: mdl-28612035

ABSTRACT

An important challenge to using fluorodeoxyglucose-positron emission tomography (FDG-PET) in clinical trials of brain tumor patients is to identify malignant regions whose metabolic activity shows significant changes between pretreatment and posttreatment scans in the presence of high normal brain background metabolism. This paper describes a semiautomated processing and analysis pipeline that is able to detect such changes objectively with a given false detection rate. Image registration and voxelwise comparison of the pre- and posttreatment images were performed. A key step is adjustment of the observed difference by the estimated background change at each voxel, thereby overcoming the confounding effect of spatially heterogeneous metabolic activity in the brain. Components of the proposed method were validated via phantom experiments and computer simulations. It achieves a false response volume accuracy of 0.4% at a significance threshold of 3 standard deviations. It is shown that the proposed methodology can detect lesion response with 100% accuracy with a tumor-to-background ratio as low as 1.5, and it is not affected by changes in background brain glucose metabolism. We also applied the method to FDG-PET patient images from a clinical trial to assess treatment effects of lapatinib, which demonstrated significant changes in metabolism corresponding to tumor regions.

19.
Commun Stat Simul Comput ; 46(6): 4851-4879, 2017.
Article in English | MEDLINE | ID: mdl-31452576

ABSTRACT

This paper presents nonparametric two-sample bootstrap tests for means of random symmetric positive-definite (SPD) matrices according to two different metrics: the Frobenius (or Euclidean) metric, inherited from the embedding of the set of SPD matrices in the Euclidean space of symmetric matrices, and the canonical metric, which is defined without an embedding and suggests an intrinsic analysis. A fast algorithm is used to compute the bootstrap intrinsic means in the case of the latter. The methods are illustrated in a simulation study and applied to a two-group comparison of means of diffusion tensors (DTs) obtained from a single voxel of registered DT images of children in a dyslexia study.
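The contrast between the two metrics can be sketched through the means they induce. As an assumption-laden stand-in for the paper's canonical-metric intrinsic mean, the log-Euclidean mean below averages matrices in the matrix-log domain via eigendecompositions (the true canonical Fréchet mean generally requires the fast iterative algorithm the paper references):

```python
import numpy as np

def logm_spd(A):
    # matrix logarithm of a symmetric positive-definite matrix via eigh
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.log(w)) @ V.T

def expm_sym(A):
    # matrix exponential of a symmetric matrix via eigh
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.exp(w)) @ V.T

mats = [np.diag([1.0, 2.0]), np.diag([4.0, 8.0])]

# Frobenius (Euclidean) mean: ordinary arithmetic average of the matrices
euclidean_mean = sum(mats) / len(mats)

# log-Euclidean mean: average in the matrix-log domain, then map back
log_euclidean_mean = expm_sym(sum(logm_spd(A) for A in mats) / len(mats))
```

For these commuting matrices the log-Euclidean mean reduces to eigenvalue-wise geometric means, diag(2, 4), versus the arithmetic diag(2.5, 5) — the kind of discrepancy that makes the choice of metric matter for tests on diffusion tensors.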

20.
Commun Stat Simul Comput ; 46(1): 127-144, 2017.
Article in English | MEDLINE | ID: mdl-31501637

ABSTRACT

Feature extraction from observed noisy samples is a common and important problem in statistics and engineering. This paper presents a novel general statistical approach to the region detection problem in long data sequences. The proposed technique is a multi-scale kernel regression in conjunction with statistical multiple testing for region detection, controlling the false discovery rate (FDR) while maximizing the signal-to-noise ratio (SNR) via matched filtering. This is achieved by considering a one-dimensional (1D) region detection problem as its equivalent zero-dimensional (0D) peak detection problem. The detection method does not require a priori knowledge of the shape of the non-zero regions. However, if the shape of the non-zero regions is known a priori, e.g. a rectangular pulse, the signal regions can also be reconstructed from the detected peaks, seen as their topological point representatives. Simulations show that the method can effectively perform signal detection and reconstruction in the simulated data under high noise conditions, while controlling the FDR of detected regions and their reconstructed length.
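The FDR control referenced above is typically the Benjamini-Hochberg step-up procedure applied to the per-peak p-values; a generic sketch of that step (not the paper's matched-filter pipeline) is:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure: returns a boolean mask of
    rejected hypotheses, controlling the FDR at level q for independent tests."""
    p = np.asarray(pvals, float)
    m = len(p)
    order = np.argsort(p)
    thresh = q * np.arange(1, m + 1) / m          # per-rank thresholds k*q/m
    below = p[order] <= thresh
    # reject the k smallest p-values, where k is the largest rank passing
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True
    return reject

print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.3], 0.05))
```

Each detected peak surviving this mask then marks a region, whose extent can be reconstructed when the pulse shape is known.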
